Bicomplex Neural Networks with Hypergeometric Activation Functions

Authors

Abstract

Bicomplex convolutional neural networks (BCCNN) are a natural extension of quaternion convolutional neural networks to the bicomplex case. As in the quaternionic case, a BCCNN has the capability of learning and modelling the external dependencies that exist between neighbouring features of an input vector, as well as the internal latent dependencies within each feature. This property arises from the fact that, under certain circumstances, it is possible to handle bicomplex numbers in a component-wise way. In this paper, we present a BCCNN and apply it to a classification task involving a colourized version of the well-known MNIST dataset. Besides the novelty of considering bicomplex numbers, our CNN uses a Bessel-type function as its activation function. As we will see, this yields better results than the case where the classical ReLU activation is used.
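
The abstract does not spell out the layer construction, so the following is only a minimal sketch. It assumes the idempotent decomposition of bicomplex numbers (which is what enables the component-wise treatment the abstract mentions) and an illustrative Bessel-flavoured activation built from the confluent hypergeometric limit function via scipy.special.hyp0f1; the paper's actual BCCNN layer and activation function may differ.

```python
# Minimal sketch only; the layer and activation choices below are assumptions,
# not the paper's construction.
import numpy as np
from scipy.special import hyp0f1  # confluent hypergeometric limit function 0F1

# A bicomplex number z = z1 + z2*j (z1, z2 complex; i, j commuting, i^2 = j^2 = -1)
# splits, via the idempotents e = (1 + ij)/2 and e* = (1 - ij)/2, into two
# ordinary complex components -- this is what allows component-wise processing.
def to_idempotent(z1, z2):
    return z1 - 1j * z2, z1 + 1j * z2

def from_idempotent(w1, w2):
    return (w1 + w2) / 2, 1j * (w1 - w2) / 2

# Hypothetical Bessel-flavoured activation: 0F1(b; -|w|^2/4) is, up to constants,
# the radial profile of a Bessel function of the first kind, used here as a gain.
def hypergeometric_activation(w, b=2.0):
    return w * hyp0f1(b, -np.abs(w) ** 2 / 4.0)

# Toy forward pass: one bicomplex "weight" applied component-wise, then the
# activation, then conversion back to the (z1, z2) form.
rng = np.random.default_rng(0)
x1 = rng.standard_normal(8) + 1j * rng.standard_normal(8)
x2 = rng.standard_normal(8) + 1j * rng.standard_normal(8)
w1, w2 = to_idempotent(x1, x2)
w1, w2 = hypergeometric_activation(0.5 * w1), hypergeometric_activation(0.5 * w2)
y1, y2 = from_idempotent(w1, w2)
print(y1[:2], y2[:2])
```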


Similar Articles

Stochastic Neural Networks with Monotonic Activation Functions

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call exponential family RBM (Exp-RBM), is a subset of t...

Full text
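
A hedged sketch of the idea in the entry above, under the assumption that the Laplace approximation amounts to sampling the unit's output as roughly Gaussian with mean f(x) and variance f'(x) for a smooth monotonic f (softplus is used here for illustration); the paper's exact formulation may differ.

```python
# Stochastic unit from a smooth monotonic activation (illustrative assumption):
# sample y ~ N(f(x), f'(x)) using only Gaussian noise.
import numpy as np

def softplus(x):
    # numerically stable log(1 + exp(x))
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def softplus_grad(x):
    return 1.0 / (1.0 + np.exp(-x))  # sigmoid = d/dx softplus(x)

def stochastic_unit(x, rng):
    mean, var = softplus(x), softplus_grad(x)
    return mean + np.sqrt(var) * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
pre_activations = np.linspace(-3, 3, 7)
print(stochastic_unit(pre_activations, rng))
```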

Deep Neural Networks with Multistate Activation Functions

We propose multistate activation functions (MSAFs) for deep neural networks (DNNs). These MSAFs are new kinds of activation functions which are capable of representing more than two states, including the N-order MSAFs and the symmetrical MSAF. DNNs with these MSAFs can be trained via conventional Stochastic Gradient Descent (SGD) as well as mean-normalised SGD. We also discuss how these MSAFs p...

Full text
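
Purely illustrative sketch of an activation with more than two output states, built here as a sum of shifted sigmoids (a smooth staircase); this is not the exact parameterisation of the N-order or symmetrical MSAFs defined in the cited paper.

```python
# A smooth "staircase" activation with several output plateaus
# (an illustrative stand-in for a multistate activation function).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multistate_activation(x, n_states=3, shift=4.0):
    # Summing n_states shifted sigmoids gives plateaus at 0, 1, ..., n_states.
    return sum(sigmoid(x - k * shift) for k in range(n_states))

x = np.linspace(-5, 15, 9)
print(np.round(multistate_activation(x), 3))
```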

A Subclass of Analytic Functions Associated with Hypergeometric Functions

In the present paper, we establish sufficient conditions for Gaussian hypergeometric functions to be in a certain subclass of analytic univalent functions in the unit disc $\mathcal{U}$. Furthermore, we investigate several mapping properties of the Hohlov linear operator for this subclass and also examine an integral operator acting on hypergeometric functions.

Full text
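
For context, the Gaussian hypergeometric function referred to in this entry is the classical series below; this standard definition is included only as a reminder.

```latex
% Gaussian hypergeometric series, with the Pochhammer symbol
% (a)_n = a(a+1)\cdots(a+n-1):
{}_2F_1(a,b;c;z) = \sum_{n=0}^{\infty} \frac{(a)_n (b)_n}{(c)_n \, n!} \, z^n,
\qquad |z| < 1 .
```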

Complex-valued Neural Networks with Non-parametric Activation Functions

Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers. However, several analytical properties of the complex domain (e.g., holomorphicity) make the design of CVNNs a more challenging task than their real counterpart. In this paper, we consider the problem of flexible activation functions (AFs) in the c...

Full text
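
Only a toy, split-complex stand-in for the flexible activation functions discussed in this entry: a Gaussian-kernel expansion with trainable mixing coefficients applied separately to real and imaginary parts. The cited paper's actual complex-valued constructions are assumed to differ in detail.

```python
# Split-complex, kernel-based activation with trainable coefficients
# (illustrative only).
import numpy as np

class KernelActivation:
    def __init__(self, n_centres=10, span=3.0, gamma=1.0, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        self.centres = np.linspace(-span, span, n_centres)
        self.gamma = gamma
        self.alpha = 0.1 * rng.standard_normal(n_centres)  # trainable mixing weights

    def _real(self, x):
        # Gaussian kernel expansion around fixed centres.
        k = np.exp(-self.gamma * (x[..., None] - self.centres) ** 2)
        return k @ self.alpha

    def __call__(self, z):
        # Apply the same flexible real activation to real and imaginary parts.
        return self._real(z.real) + 1j * self._real(z.imag)

act = KernelActivation()
z = np.array([0.5 + 0.2j, -1.0 + 1.5j])
print(act(z))
```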

Recurrent neural networks with trainable amplitude of activation functions

An adaptive amplitude real time recurrent learning (AARTRL) algorithm for fully connected recurrent neural networks (RNNs) employed as nonlinear adaptive filters is proposed. Such an algorithm is beneficial when dealing with signals that have rich and unknown dynamical characteristics. Following the approach from, three different cases for the algorithm are considered; a common adaptive amplitu...

Full text
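
A minimal sketch of the "trainable amplitude" idea, assuming an activation of the form y = a·tanh(x) whose amplitude a is adapted by gradient descent on a squared error; the full AARTRL algorithm for recurrent networks used as adaptive filters is considerably more involved.

```python
# Activation with a trainable amplitude, adapted by gradient descent.
import numpy as np

def activation(x, a):
    return a * np.tanh(x)

def amplitude_grad(x, a, target):
    # d/da of 0.5 * (target - a*tanh(x))**2
    return -(target - a * np.tanh(x)) * np.tanh(x)

a, lr = 1.0, 0.1
x, target = 0.8, 0.5
for _ in range(200):
    a -= lr * amplitude_grad(x, a, target)
print(round(a, 3), round(activation(x, a), 3))  # amplitude settles near target/tanh(x)
```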


Journal

Journal: Advances in Applied Clifford Algebras

Year: 2023

ISSN: 0188-7009, 1661-4909

DOI: https://doi.org/10.1007/s00006-023-01268-w